6 research outputs found

    PIM-Enclave: Bringing Confidential Computation Inside Memory

    Full text link
    Data-intensive workloads and confidential computing are the prominent research directions shaping the future of cloud computing. Computer architectures are evolving to better accommodate computation over large data. Protecting the computation of sensitive data is also an imperative yet challenging objective; processor-supported secure enclaves serve as the key element of confidential computing in the cloud. However, side-channel attacks threaten their security boundaries. Current processor architectures consume a considerable portion of their cycles moving data. Near-data computation is a promising approach that minimizes redundant data movement by placing computation inside storage. In this paper, we present a novel design for Processing-In-Memory (PIM) as a data-intensive workload accelerator for confidential computing. Based on our observation that moving computation closer to memory can achieve efficiency of computation and confidentiality of the processed information simultaneously, we study the advantages of confidential computing inside memory. We then explain the security model and programming model we developed for PIM-based computation offloading. We consolidate our findings into a software-hardware co-design, which we call PIM-Enclave. Our design illustrates the advantages of PIM-based confidential computing acceleration. Our evaluation shows that PIM-Enclave provides side-channel-resistant secure computation offloading and runs data-intensive applications with negligible performance overhead compared to a baseline PIM model.
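
    The abstract does not spell out the offloading interface, but the flow it implies (seal data on the host, compute inside the memory-side enclave, return a sealed result) can be mocked in software. The sketch below is a minimal illustration under that assumption; SimulatedPimEnclave, offload, and the AES-GCM key handling are hypothetical stand-ins, not the paper's design, and a real system would establish the key via attestation.

```python
# Illustrative software mock of a confidential offload flow, loosely in the
# spirit of enclave-style PIM offloading. All names here are hypothetical.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

class SimulatedPimEnclave:
    """Mock of a memory-side enclave sharing a symmetric key with the host."""
    def __init__(self, key: bytes):
        self._aead = AESGCM(key)

    def offload(self, nonce: bytes, sealed_input: bytes):
        # Plaintext exists only "inside" the simulated enclave boundary.
        data = self._aead.decrypt(nonce, sealed_input, b"pim-offload")
        result = sum(data).to_bytes(8, "little")   # stand-in for a data-intensive kernel
        out_nonce = os.urandom(12)
        return out_nonce, self._aead.encrypt(out_nonce, result, b"pim-result")

# Host side: seal the input, offload it, unseal the returned result.
key = AESGCM.generate_key(bit_length=128)          # stands in for an attested key exchange
enclave = SimulatedPimEnclave(key)
host_aead, nonce = AESGCM(key), os.urandom(12)
sealed = host_aead.encrypt(nonce, bytes(range(100)), b"pim-offload")
out_nonce, sealed_result = enclave.offload(nonce, sealed)
print(int.from_bytes(host_aead.decrypt(out_nonce, sealed_result, b"pim-result"), "little"))  # 4950
```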

    Capacity: Cryptographically-Enforced In-Process Capabilities for Modern ARM Architectures (Extended Version)

    Full text link
    In-process compartmentalization and access control have been actively explored to provide in-place and efficient isolation of in-process security domains. Many works have proposed compartmentalization schemes that leverage hardware features, most notably the new page-based memory isolation feature called Protection Keys for Userspace (PKU) on x86. Unfortunately, the modern ARM architecture has no equivalent feature. Instead, newer ARM architectures introduced Pointer Authentication (PA) and Memory Tagging Extension (MTE), adapting the reference validation model for memory safety and runtime exploit mitigation. We argue that these features have been underexplored in the context of compartmentalization and that they can be retrofitted to implement a capability-based in-process access control scheme. This paper presents Capacity, a novel hardware-assisted intra-process access control design that embraces capability-based security principles. Capacity coherently incorporates the new hardware security features on ARM that already exhibit inherent characteristics of capabilities. It supports life-cycle protection of a domain's sensitive objects, from their import from the file system to their placement in memory. With intra-process domains authenticated by unique PA keys, Capacity transforms file descriptors and memory pointers into cryptographically-authenticated references and completely mediates reference usage with its program instrumentation framework and an efficient system call monitor. We evaluate our Capacity-enabled NGINX web server prototype and other common applications in which sensitive resources are isolated into different domains. Our evaluation shows that Capacity incurs a low performance overhead of approximately 17% for the single-threaded and 13.54% for the multi-threaded web server.
    Comment: Accepted at ACM CCS 202
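
    Capacity's authenticated references are built on ARM PA keys in hardware. As a rough software analogue, the sketch below swaps the PA signing step for HMAC-SHA256 tags over file descriptors, with a per-domain key and a mediation check before use; all names (Domain, grant, check) are hypothetical illustrations, not the paper's API.

```python
# Software analogue of cryptographically-authenticated references: HMAC tags
# stand in for hardware Pointer Authentication, and check() stands in for the
# mediation a system call monitor would perform.
import hmac, hashlib, os

class Domain:
    """An intra-process security domain with its own reference-signing key."""
    def __init__(self, name: str):
        self.name = name
        self._key = os.urandom(32)   # per-domain key, loosely analogous to a PA key

    def grant(self, fd: int):
        """Turn a raw file descriptor into an authenticated reference (fd, tag)."""
        return fd, hmac.new(self._key, str(fd).encode(), hashlib.sha256).digest()

    def check(self, ref) -> int:
        """Mediate use: reject references not signed under this domain's key."""
        fd, tag = ref
        expected = hmac.new(self._key, str(fd).encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise PermissionError(f"reference not valid in domain {self.name!r}")
        return fd

secrets_dom, net_dom = Domain("secrets"), Domain("network")
with open(os.devnull, "rb") as f:
    ref = secrets_dom.grant(f.fileno())
    print(secrets_dom.check(ref))    # OK inside the owning domain
    try:
        net_dom.check(ref)           # fails: signed under a different domain's key
    except PermissionError as err:
        print(err)
```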

    Flood hazard and resilience in the watershed Nhat Le–Kien Giang in Vietnam

    No full text
    Floods are among the most dangerous natural disasters, responsible for considerable loss of life and economic damage, especially in Southeast Asian countries such as Vietnam. Improving community resilience can both reduce damage and alleviate the economic burden on communities in the aftermath of floods. Flood damage reduction strategies in Vietnam mainly focus on flood hazard and often neglect community resilience. The objective of this article is to develop a theoretical framework for assessing the relationship between flood hazard and community resilience in the Nhat Le–Kien Giang River watershed. Flood hazard was calculated by integrating flood depth and flood velocity using MIKE FLOOD modeling and the Analytic Hierarchy Process (AHP), while data on community resilience were collected via questionnaire; a total of 70 households in Hong Thuy commune were selected by random sampling. The results showed that the high-hazard region is concentrated along the river, in communes such as Tan Ninh, Gia Ninh, Hong Thuy, and Hoa Thuy. They also illustrate how access to resources, prevention strategies, and community perceptions are important factors in improving community resilience. The results provide a theoretical framework that can support decision-makers in building appropriate damage-reduction strategies.
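
    As a minimal illustration of the hazard computation described above, the sketch below derives AHP weights for the two criteria (depth, velocity) from a pairwise comparison matrix and combines normalized layers into a hazard index. The pairwise judgment value and the toy rasters are assumptions for illustration, not the study's data.

```python
# Sketch of an AHP-weighted flood hazard index from depth and velocity layers.
# The judgment "depth is twice as important as velocity" is an assumed example.
import numpy as np

# 2x2 AHP pairwise comparison matrix over the criteria (depth, velocity).
A = np.array([[1.0, 2.0],
              [0.5, 1.0]])

# Weights = principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                                   # here: [0.667, 0.333]

# Toy layers (m and m/s); real inputs would come from MIKE FLOOD outputs.
depth = np.array([[0.2, 1.5], [3.0, 0.0]])
velocity = np.array([[0.1, 0.8], [2.0, 0.0]])

def norm(x):
    """Min-max normalize a layer to [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

hazard = w[0] * norm(depth) + w[1] * norm(velocity)
print(np.round(hazard, 2))
```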

    An Efficient Hardware Implementation of Residual Data Binarization in HEVC CABAC Encoder

    No full text
    HEVC-compliant encoders employ context-based adaptive binary arithmetic coding (CABAC) to achieve the high compression ratios and video quality required by modern real-time high-quality video services. The binarizer is one of the three main blocks in a CABAC architecture; it generates the binary symbols (bins) that feed the binary arithmetic encoder (BAE). Residual video data accounts for an average of 75% of the CABAC workload, so the binarizer's performance contributes significantly to the overall performance of the whole CABAC design. This paper proposes an efficient hardware implementation of a binarizer for CABAC that focuses on low area cost and low power consumption while still providing enough bins for high-throughput CABAC. On average, the proposed design processes up to 3.5 residual syntax elements (SEs) per clock cycle at a maximum frequency of 500 MHz, with an area cost of 9.45 Kgates (6.41 Kgates for the binarizer core) and a power consumption of 0.239 mW (0.184 mW for the binarizer core) in NanGate 45 nm technology. Our proposal achieves a high overhead efficiency of 1.293 Mbins/Kgate/mW, much better than other related high-performance designs. It also achieves a high power efficiency of 8288 Mbins/mW, an important factor for handheld applications.
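
    For readers unfamiliar with what a binarizer computes, the sketch below models two schemes that CABAC binarization commonly combines for residual syntax elements: truncated unary (TU) and a k-th order Exp-Golomb (EGk) escape code with escalating k. It is a generic software model of the schemes, not the paper's hardware design.

```python
# Software model of two binarization schemes used for residual data in CABAC.

def truncated_unary(value: int, c_max: int) -> list[int]:
    """TU: 'value' ones, then a terminating zero unless value == c_max."""
    bins = [1] * value
    if value < c_max:
        bins.append(0)
    return bins

def exp_golomb_k(value: int, k: int) -> list[int]:
    """EGk with escalating k, the style used for coefficient-level escape codes."""
    bins = []
    while value >= (1 << k):        # prefix: one 0 per escape step
        bins.append(0)
        value -= (1 << k)
        k += 1
    bins.append(1)                  # prefix terminator
    bins.extend((value >> i) & 1 for i in range(k - 1, -1, -1))  # k-bit suffix
    return bins

print(truncated_unary(2, 4))   # [1, 1, 0]
print(exp_golomb_k(3, 0))      # [0, 0, 1, 0, 0]
```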

    Evaluating the effects of climate and land use change on the future flood susceptibility in the central region of Vietnam by integrating land change modeler, machine learning methods

    No full text
    Land use/land cover change and climate change are crucial to worldwide sustainability because of their negative effects on flood risk. At the watershed scale, the relationship between land use change, climate change, and flood risk remains a subject of controversy in the literature. This study assesses the effects of land use and climate change on flood susceptibility in the Nhat Le–Kien Giang watershed, Vietnam, using machine learning and the Land Change Modeler. The results show that Social Ski Driver Optimization (SSD), Fruit Fly Optimization (FFO), Sailfish Optimization (SFO), and Particle Swarm Optimization (PSO) all improve the Support Vector Machine (SVM) model's performance, with Area Under the Receiver Operating Characteristic curve (AUC) values above 0.96. Among them, the SVM-FFO model performed best with an AUC of 0.984, followed by SVM-SFO (AUC = 0.983), SVM-SSD (AUC = 0.98), and SVM-PSO (AUC = 0.97). In addition, the area with high and very high flood susceptibility in the study region increases by about 30 km² from 2020 to 2050 under the SVM-FFO model. Our results underline the consequences of unplanned development. By applying the theoretical framework of this study, decision-makers can take sounder planning measures, such as avoiding construction in areas frequently affected by floods. Although flood susceptibility is studied here in a Central Coast province, the results can be applied to other rapidly developing, flood-prone provinces of Vietnam.
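
    The modeling step pairs an SVM with a metaheuristic that tunes its hyperparameters for AUC. The sketch below captures that loop with a plain random search standing in for SSD/FFO/SFO/PSO, on synthetic data rather than the flood inventory; a proper study would score candidates on a validation split separate from the final test set.

```python
# Minimal sketch: tune an SVM's (C, gamma) and score by AUC. Random search
# stands in for the paper's metaheuristics; the data is synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

# Stand-in for flood conditioning factors and flood/non-flood labels.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
best = (None, -1.0)
for _ in range(30):                          # each draw ~ one candidate "particle"
    C = 10 ** rng.uniform(-2, 3)
    gamma = 10 ** rng.uniform(-4, 1)
    model = SVC(C=C, gamma=gamma).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.decision_function(X_te))
    if auc > best[1]:
        best = ((C, gamma), auc)

print(f"best (C, gamma) = {best[0]}, AUC = {best[1]:.3f}")
```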

    Predicting Future Urban Flood Risk Using Land Change and Hydraulic Modeling in a River Watershed in the Central Province of Vietnam

    No full text
    Flood risk is a significant challenge for sustainable spatial planning, particularly in the context of climate change and urbanization. Devising suitable land planning strategies requires assessing future flood risk and predicting the impact of urban sprawl. This study develops an innovative approach combining land use change and hydraulic models to explore future urban flood risk and to reduce it under different vulnerability and exposure scenarios. SPOT-3 and Sentinel-2 images were processed and classified to create land cover maps for 1995 and 2019, which were then used to predict the 2040 land cover with the Land Change Modeler module of Terrset. Flood risk was computed by combining hazard, exposure, and vulnerability using hydrodynamic modeling and the Analytic Hierarchy Process method. We compared flood risk in 1995, 2019, and 2040. Although flood risk increases with urbanization, population density, and the number of hospitals in the flood plain, especially in the coastal region, the area exposed to high and very high risks decreases due to a reduction in the poverty rate. This study provides a theoretical framework supporting climate change-related risk assessment in other metropolitan regions. Methodologically, it underlines the importance of satellite imagery and data continuity in the planning-related decision-making process.
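
    The land-change side of the approach rests on a class-transition model between the two observed dates. As a minimal sketch of that idea, the code below row-normalizes an invented 1995-to-2019 cross-tabulation into a transition matrix and rolls class shares forward one interval (treating 2019 to 2040 as roughly one more step); the counts are illustrative, not the study's data.

```python
# Sketch of the land-change projection idea behind Land Change Modeler:
# estimate a class-transition matrix from two classified maps, then project.
import numpy as np

classes = ["urban", "agriculture", "forest", "water"]

# crosstab[i, j]: pixels that were class i in 1995 and class j in 2019.
crosstab = np.array([
    [ 90,   5,   3,  2],
    [ 30, 150,  15,  5],
    [ 10,  20, 160, 10],
    [  2,   3,   5, 90],
], dtype=float)

# Row-normalize to get per-class transition probabilities for one interval.
P = crosstab / crosstab.sum(axis=1, keepdims=True)

shares_2019 = crosstab.sum(axis=0) / crosstab.sum()   # class shares in 2019
shares_2040 = shares_2019 @ P                         # apply one more interval

for name, s19, s40 in zip(classes, shares_2019, shares_2040):
    print(f"{name:12s} 2019: {s19:.2%} -> 2040: {s40:.2%}")
```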